In-situ animal behavior classification using knowledge distillation and fixed-point quantization

Authors

Abstract

We explore the use of knowledge distillation (KD) for learning compact and accurate models that enable classification of animal behavior from accelerometry data on wearable devices. To this end, we take a deep and complex convolutional neural network, known as residual network (ResNet), as the teacher model. ResNet is specifically designed for multivariate time-series classification. We use the teacher model to distill the considered datasets into soft labels, which consist of the predicted pseudo-probabilities of every class for each datapoint. We then use the soft labels to train our significantly less complex student models, which are based on the gated recurrent unit (GRU) and multilayer perceptron (MLP). The evaluation results using two real-world datasets show that the classification accuracy of the student GRU and MLP models improves appreciably through KD, approaching that of the teacher ResNet model. To further reduce the computational and memory requirements of performing inference using the student models trained via KD, we utilize dynamic fixed-point quantization (DQ) through an appropriate modification of the computational graph of the considered models. We implement both the unquantized and quantized versions of the developed KD-based student models on the embedded systems of our purpose-built collar and ear tag devices to classify animal behavior in situ and in real time. Our evaluations corroborate the effectiveness of KD and DQ in improving the accuracy and efficiency of in-situ animal behavior classification.
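The abstract describes distilling a teacher's predicted pseudo-probabilities into soft labels that train a smaller student. The following is a minimal NumPy sketch of such a soft-label distillation loss; the function names, temperature `T`, and mixing weight `alpha` are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def softmax(z, T=1.0):
    """Temperature-scaled softmax over the last axis."""
    z = np.asarray(z, dtype=float) / T
    z = z - z.max(axis=-1, keepdims=True)  # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, hard_labels,
                      T=2.0, alpha=0.5):
    """Blend cross-entropy against the teacher's soft labels (pseudo-
    probabilities) with ordinary cross-entropy on the hard labels."""
    p_teacher = softmax(teacher_logits, T)   # soft labels from the teacher
    p_student = softmax(student_logits, T)
    soft_ce = -(p_teacher * np.log(p_student + 1e-12)).sum(axis=-1).mean()
    q = softmax(student_logits)              # T = 1 for the hard-label term
    hard_ce = -np.log(q[np.arange(len(hard_labels)), hard_labels] + 1e-12).mean()
    # T**2 rescales the soft term's gradient magnitude (standard KD practice)
    return alpha * (T ** 2) * soft_ce + (1.0 - alpha) * hard_ce
```

In a training loop, this scalar would replace the plain cross-entropy loss; the student sees both the teacher's class similarities (via the soft term) and the ground truth (via the hard term).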


Similar resources

Fixed Point Quantization of Deep Convolutional Networks

In recent years increasingly complex architectures for deep convolution networks (DCNs) have been proposed to boost the performance on image recognition tasks. However, the gains in performance have come at a cost of substantial increase in computation and model storage resources. Fixed point implementation of DCNs has the potential to alleviate some of these complexities and facilitate potenti...
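Fixed-point implementation of the kind described above can be sketched as a dynamic fixed-point scheme that picks the number of fractional bits per tensor from its observed range. This is a hypothetical NumPy illustration; the helper names and the 8-bit word length are assumptions, not taken from either paper.

```python
import numpy as np

def quantize_dynamic_fixed_point(x, word_bits=8):
    """Dynamic fixed-point quantization: choose the fractional bit-width
    per tensor from its observed dynamic range, then round each value to
    the nearest representable fixed-point number."""
    x = np.asarray(x, dtype=float)
    max_abs = np.abs(x).max()
    if max_abs == 0:
        return np.zeros_like(x, dtype=np.int32), word_bits - 1
    # integer bits needed to cover the magnitude; one bit is kept for sign
    int_bits = max(0, int(np.ceil(np.log2(max_abs + 1e-12))))
    frac_bits = word_bits - 1 - int_bits
    scale = 2.0 ** frac_bits
    q = np.clip(np.round(x * scale),
                -(2 ** (word_bits - 1)),
                2 ** (word_bits - 1) - 1).astype(np.int32)
    return q, frac_bits

def dequantize(q, frac_bits):
    """Map fixed-point integers back to real values."""
    return q.astype(float) / (2.0 ** frac_bits)
```

Because the fractional bit-width adapts to each tensor's range, small-magnitude tensors (e.g. normalized weights) keep more precision than a single static format would allow.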


Some fixed point theorems and common fixed point theorem in log-convex structure

Some fixed point theorems and a common fixed point theorem in logarithmic convex structures are proved.


Model compression via distillation and quantization

Deep neural networks (DNNs) continue to make significant advances, solving tasks from image classification to translation or reinforcement learning. One aspect of the field receiving considerable attention is efficiently executing deep models in resource-constrained environments, such as mobile or embedded devices. This paper focuses on this problem, and proposes two new compression methods, wh...


Coefficient Quantization Error Free Fixed-point Iir Polynomial Predictor Design

In this paper, roundoff noise properties of fixed-point IIR polynomial predictors (FIPPs) and polynomial-predictive differentiators (FIPPDs) are investigated. These filters are designed by augmenting the corresponding FIR basis filters with magnitude response shaping feedbacks. Here we use ideally quantized coefficient (coefficient quantization error free) polynomial FIR predictors (PFPs) or pr...


EFL learners' collocational knowledge in writing and translation

This study investigates the collocational knowledge of Persian-speaking students of English. First, the relationship between the learners' general language proficiency and their collocational knowledge is examined. The second aim of the study is to investigate the strategies used in translating English collocations into Persian. Finally, the learners' errors in producing and comprehending collocations are analyzed. Two hundred and twenty-seven language students participated in this stu...


Journal

Journal title: Smart Agricultural Technology

Year: 2023

ISSN: 2772-3755

DOI: https://doi.org/10.1016/j.atech.2022.100159